    SoK: Chasing Accuracy and Privacy, and Catching Both in Differentially Private Histogram Publication

    Histograms and synthetic data are of key importance in data analysis. However, researchers have shown that even aggregated data such as histograms, containing no obvious sensitive attributes, can result in privacy leakage. To enable data analysis, a strong notion of privacy is required to avoid risking unintended privacy violations. Such a strong notion of privacy is differential privacy, a statistical notion of privacy that makes privacy leakage quantifiable. The caveat regarding differential privacy is that while it provides strong guarantees for privacy, privacy comes at a cost in accuracy. Despite this trade-off being a central and important issue in the adoption of differential privacy, there exists a gap in the literature when it comes to providing an understanding of the trade-off and how to address it appropriately. Through a systematic literature review (SLR), we investigate the state of the art in accuracy-improving differentially private algorithms for histogram and synthetic data publishing. Our contribution is two-fold: 1) we identify trends and connections in the contributions to the field of differential privacy for histograms and synthetic data, and 2) we provide an understanding of the privacy/accuracy trade-off challenge by crystallizing different dimensions of accuracy improvement. Accordingly, we position and visualize the ideas in relation to each other and to external work, and deconstruct each algorithm to examine its building blocks separately, with the aim of pinpointing which dimension of accuracy improvement each technique/approach is targeting. Hence, this systematization of knowledge (SoK) provides an understanding of in which dimensions, and how, accuracy improvement can be pursued without sacrificing privacy.
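
    As a concrete illustration of the trade-off the SoK studies, the following minimal sketch (the textbook baseline Laplace mechanism, not any specific surveyed algorithm; function and variable names are our own) publishes a histogram under epsilon-differential privacy:

        # Baseline differentially private histogram via the Laplace mechanism.
        # A histogram query has sensitivity 1 (one individual changes exactly
        # one bin count by one), so Laplace noise with scale 1/epsilon per bin
        # yields epsilon-differential privacy. Smaller epsilon means stronger
        # privacy, larger noise, and thus lower accuracy.
        import numpy as np

        def dp_histogram(data, bins, epsilon):
            counts, edges = np.histogram(data, bins=bins)
            noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon, size=counts.shape)
            return counts + noise, edges

        ages = np.random.randint(18, 90, size=1000)               # toy data
        accurate, _ = dp_histogram(ages, bins=10, epsilon=1.0)    # weaker privacy
        noisy, _ = dp_histogram(ages, bins=10, epsilon=0.1)       # stronger privacy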

    Privacy-aware Use of Accountability Evidence

    This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and a privacy-aware solution to the problem of unauthorized evidence disclosure. Legal means to protect the privacy of an individual are anchored in the data protection perspective, i.e., in the responsible collection and use of personal data. Accountability plays a crucial role in such legal privacy frameworks for assuring an individual's privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. System traces that record system activities are the essential inputs to those audits. Nevertheless, the traces that enable accountability are themselves subject to privacy risks because, in most cases, they inform about the processing of personal data. Therefore, ensuring the privacy of the accountability traces is as important as ensuring the privacy of the personal data itself. However, by and large, research involving accountability traces is concerned with storage, interoperability, and analytics challenges rather than with the privacy implications of processing them. This dissertation focuses both on the application of accountability evidence, such as in automated privacy audits, and on its privacy-aware use.
    The overall aim of the thesis is to provide a conceptual understanding of the privacy compliance research domain and to contribute to solutions that promote privacy-aware use of the traces that enable accountability. To address the first part of this objective, a systematic study of the existing body of knowledge on automated privacy compliance is conducted, and the state-of-the-art is conceptualized as taxonomies. The second part is accomplished through two results: first, a systematic understanding of the privacy challenges involved in processing the system traces; second, a model for privacy-aware access restrictions, proposed and formalized in order to prevent illegitimate access to the system traces. Access to accountability traces such as provenance is required for the automatic fulfillment of accountability obligations, but the traces themselves contain personally identifiable information; hence, this thesis provides a solution to prevent unauthorized access to provenance traces.
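
    To make the idea of privacy-aware access restrictions on traces concrete, here is a deliberately simplified, hypothetical sketch (not the thesis's formal model; entry fields and the purpose vocabulary are invented for illustration): a trace entry is disclosed to a requester only if the requester's declared purpose is among the purposes permitted for that entry.

        # Hypothetical purpose-based release of accountability traces.
        from dataclasses import dataclass, field

        @dataclass
        class TraceEntry:
            actor: str                      # who acted on the personal data
            action: str                     # e.g. "read", "update"
            allowed_purposes: set = field(default_factory=set)

        def release(entries, requester_purpose):
            # Disclose only the entries the requester may legitimately see.
            return [e for e in entries if requester_purpose in e.allowed_purposes]

        log = [TraceEntry("billing-svc", "read", {"audit", "billing"}),
               TraceEntry("analytics-svc", "read", {"analytics"})]
        print(release(log, "audit"))    # only the billing-svc entry is disclosed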

    Go the Extra Mile for Accountability : Privacy Protection Measures for Emerging Information Management Systems

    The thesis considers a systematic approach to designing and developing techniques for preventing personal data exposure in next-generation information management systems, with the aim of ensuring accountability of data controllers (entities that process personal data). With the rapid growth in communication technologies, heterogeneous computing environments that offer cost-effective data processing alternatives are emerging. Thus, the information flow of personal data spans beyond the information processing practices of data controllers, thereby involving other parties that process personal data. Moreover, in order to enable interoperability, data in such environments is given well-defined structure and meaning by means of graph-based data models.
    Graphs inherently emphasize connections between things, and when graphs are used to model personal data records, the connections and the network structure may reveal intimate details about our interconnected society. In the European context, the General Data Protection Regulation (GDPR) provides a legal framework for personal data processing. The GDPR stipulates specific consequences for non-compliance with the data protection principles, with a view to ensuring accountability of data controllers in their personal data processing practices. Widely recognized approaches to implementing the Privacy by Design (PbD) principle in the software application development process are broader in scope; hence, processes to implement personal data protection techniques for specific systems are not their central aspect. In order to guide the implementation of techniques for preventing personal data misuse associated with sharing data represented as graphs, a conceptual mechanism for building privacy techniques is developed. The conceptual mechanism consists of three elements: a risk analysis for Semantic Web information management systems using the Privacy Impact Assessment (PIA) approach, two privacy protection techniques for graphs enriched with semantics, and a model for evaluating adherence to the goals resulting from the risk analysis. The privacy protection techniques include an access control model that embodies the purpose limitation principle, an essential aspect of the GDPR, and adaptations of the differential privacy model for graphs with edge labels. The access control model takes the semantics of the graph elements into account when authorizing access to the graph data. In our differential privacy adaptations, we define and study through experiments four different approaches to adapting the differential privacy model to edge-labeled graph datasets.
    Article 5, included in the thesis as a manuscript, has since been published.
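
    As a hint of what adapting differential privacy to edge-labeled graphs can look like, the following hypothetical sketch (one plausible baseline under edge-level differential privacy, not any of the thesis's four adaptations) releases per-label edge counts; neighboring graphs differ in a single edge, so each per-label count has sensitivity 1.

        # Noisy per-label edge counts for an edge-labeled graph (edge-level DP).
        import numpy as np
        from collections import Counter

        def dp_edge_label_counts(edges, epsilon):
            # edges: iterable of (source, label, target) triples
            counts = Counter(label for _, label, _ in edges)
            return {lbl: c + np.random.laplace(0.0, 1.0 / epsilon)
                    for lbl, c in counts.items()}

        g = [("alice", "knows", "bob"), ("bob", "knows", "carol"),
             ("alice", "worksFor", "acme")]
        print(dp_edge_label_counts(g, epsilon=0.5))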

    Implementation Analysis of Trusted Network Connect to Ensure Endpoint Integrity and Security

    Changes in business practices and the massive growth of malware threats drive enterprises to enforce a strong authentication system for network access, rather than a lightweight username/password authentication system. Authentication systems based on username/password safeguard against a malicious user connecting to a corporate network, but fail to prevent an authorized user from engaging in malicious activities inside the network. Network Access Control (NAC) is a paradigm that consists of software and hardware components to dynamically control an endpoint's network access based on its compliance level with respect to an enterprise's policy. Many of the available NAC solutions are proprietary and closed. The Trusted Network Connect (TNC) specifications from the Trusted Computing Group are standard specifications describing the protocols, interfaces, and APIs for a NAC system that ensures the integrity of an endpoint during network connection. As with other proprietary NAC solutions, products that support the TNC specifications are proprietary and closed. This work investigates the possibility of implementing a TNC NAC system with the aid of available open-source components. A principal design aspect of the TNC specification is that it leverages existing infrastructures; accordingly, this work focuses on adapting the TNC specifications to 802.1X networks. To this end, an abstract system design was developed to analyze the challenges and possibilities in implementing a TNC NAC. The study then continues by analyzing the modifications that should be carried out on the reference architecture to support TNC. This work serves as a base for developing a TNC-based NAC in an 802.1X environment.
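
    The core decision such a NAC system automates can be sketched as follows; this is a hypothetical simplification, not the TNC protocol or its interfaces, and the policy attributes are invented for illustration.

        # Hypothetical endpoint compliance decision in a NAC system:
        # deny unauthenticated endpoints, quarantine non-compliant ones
        # (e.g. into a remediation VLAN), and allow compliant ones.
        def nac_decision(posture, policy):
            if not posture.get("authenticated", False):
                return "deny"               # 802.1X authentication failed
            compliant = all(
                posture.get(k) == v if isinstance(v, bool)
                else posture.get(k, 0) >= v
                for k, v in policy.items())
            return "allow" if compliant else "quarantine"

        policy = {"antivirus_updated": True, "patch_level": 12}
        posture = {"authenticated": True, "antivirus_updated": True, "patch_level": 10}
        print(nac_decision(posture, policy))    # -> "quarantine"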

    Survey of Differentially Private Accuracy Improving Techniques for Publishing Histograms and Synthetic Data

    Drawing insights from data sets provides enormous social value. However, privacy violations are major impediments to accessing these data sources. For example, messages on social platforms are a valuable source for a social science researcher who wants to understand how individuals fleeing from wars organize themselves, but access to such information jeopardizes the privacy of those individuals. Differential Privacy (DP) is a privacy model that provides formal guarantees to individuals that their participation in the data set is 'nearly' hidden. The differential privacy definition states that the results of an analysis on a data set remain 'essentially' the same whether or not any one individual participates in the data set. To this end, some differentially private analyses add noise to the output to obfuscate the contribution of any individual. The magnitude of the added noise is inversely proportional to the privacy guarantee. In this work we focus on analyses that produce histograms or synthetic data. Histograms and synthetic data are interesting to study because they provide a sanitized form of the original data, to which access is restricted. There is growing interest in the scientific community in demonstrating different ways to improve the accuracy of differentially private histograms or synthetic data. However, there are few works that systematize the knowledge gained by those scientific investigations. Thus our aim is to analyze and structure the state-of-the-art techniques that improve the accuracy of histograms or synthetic data published under differential privacy. We used the systematic literature review as the research method to summarize the state-of-the-art. Our preliminary result, which categorizes the state-of-the-art, is illustrated in Figure 1.
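
    Formally (these are the standard definitions, not something specific to this work): a randomized mechanism M is ε-differentially private if, for all neighboring data sets D and D' differing in one individual's record and for all output sets S,

        \[ \Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\, \Pr[\mathcal{M}(D') \in S] \]

    and the Laplace mechanism achieves this by adding noise scaled to the query's sensitivity divided by ε, which is what makes the noise magnitude inversely proportional to the privacy parameter:

        \[ \mathcal{M}(D) = f(D) + \mathrm{Lap}\!\left(\tfrac{\Delta f}{\varepsilon}\right), \qquad \Delta f = \max_{D \sim D'} \lVert f(D) - f(D') \rVert_1 . \]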

    'Culture Makes You Stronger' Aboriginal women's voices from the South Coast of NSW

    This presentation focuses on health and wellbeing research being undertaken with the Waminda Women's Health Organisation in the Shoalhaven, New South Wales.